A Second Derivative SQP Method: Local Convergence and Practical Issues
Authors
Abstract
Gould and Robinson [SIAM J. Optim., 20 (2010), pp. 2023–2048] proved global convergence of a second derivative SQP method for minimizing the exact ℓ1-merit function for a fixed value of the penalty parameter. This result required the properties of a so-called Cauchy step, which was itself computed from a so-called predictor step. In addition, they allowed for the computation of a variety of (optional) accelerator steps intended to improve the efficiency of the algorithm. The main purpose of this paper is to prove that a nonmonotone variant of the algorithm is quadratically convergent for two specific realizations of the accelerator step; this is verified with preliminary numerical results on the Hock and Schittkowski test set. Once fast local convergence is established, we consider two specific aspects of the algorithm that are important for an efficient implementation. First, we discuss a strategy for defining the positive-definite matrix Bk used in computing the predictor step that is based on a limited-memory BFGS update. Second, we provide a simple strategy for updating the penalty parameter based on approximately minimizing the ℓ1-penalty function over a sequence of increasing values of the penalty parameter.
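The two ingredients named at the end of the abstract can be illustrated concretely. Below is a minimal sketch, not the paper's algorithm: it shows the exact ℓ1-penalty function φ(x; σ) = f(x) + σ‖c(x)⁻‖₁ for constraints c(x) ≥ 0, and a simple "approximately minimize, then increase σ if infeasible" penalty-parameter update. The crude 1-D ternary-search inner solver stands in for the SQP subproblem machinery and is purely hypothetical.

```python
def l1_penalty(f, c, x, sigma):
    """Exact l1-penalty: phi(x; sigma) = f(x) + sigma * sum_i max(0, -c_i(x)),
    where feasibility means c_i(x) >= 0 for every constraint i."""
    return f(x) + sigma * sum(max(0.0, -ci) for ci in c(x))

def ternary_min(phi, lo=-10.0, hi=10.0, iters=200):
    """Crude 1-D minimizer for a unimodal phi; a stand-in (assumption) for the
    predictor/Cauchy/accelerator machinery of the actual SQP method."""
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3.0, hi - (hi - lo) / 3.0
        if phi(m1) < phi(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def minimize_with_penalty_updates(f, c, sigma=1.0, growth=10.0,
                                  tol=1e-6, rounds=25):
    """Approximately minimize phi(.; sigma) for an increasing sequence of
    penalty parameters, stopping once the iterate is (nearly) feasible."""
    x = 0.0
    for _ in range(rounds):
        x = ternary_min(lambda y: l1_penalty(f, c, y, sigma))
        if sum(max(0.0, -ci) for ci in c(x)) <= tol:  # feasible enough
            break
        sigma *= growth  # penalty too weak: increase and re-solve
    return x, sigma

# Toy example: minimize x^2 subject to x >= 1 (solution x = 1).
# With sigma = 1 the penalty minimizer x = 0.5 is infeasible, so sigma is
# increased to 10, after which the exact penalty recovers x = 1.
x_star, sigma_star = minimize_with_penalty_updates(lambda x: x * x,
                                                   lambda x: [x - 1.0])
```

The toy problem also illustrates why the penalty must be "large enough": the ℓ1 penalty is exact only once σ exceeds the magnitude of the optimal multiplier (here 2), which is precisely what an increasing-σ strategy probes for.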
Similar Resources
A Second Derivative SQP Method: Theoretical Issues
Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding th...
A Second Derivative SQP Method: Local Convergence
In [19], we gave global convergence results for a second-derivative SQP method for minimizing the exact ℓ1-merit function for a fixed value of the penalty parameter. To establish this result, we used the properties of the so-called Cauchy step, which was itself computed from the so-called predictor step. In addition, we allowed for the computation of a variety of (optional) SQP steps that were ...
A Second Derivative SQP Method: Global Convergence
Gould and Robinson (NAR 08/18, Oxford University Computing Laboratory, 2008) gave global convergence results for a second-derivative SQP method for minimizing the exact ℓ1-merit function for a fixed value of the penalty parameter. To establish this result, we used the properties of the so-called Cauchy step, which was itself computed from the so-called predictor step. In addition, we allowed fo...
A Second Derivative SQP Method with Imposed
Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding th...
A Second-Derivative Trust-Region SQP Method with a "Trust-Region-Free" Predictor Step
In (NAR 08/18 and 08/21, Oxford University Computing Laboratory, 2008) we introduced a second-derivative SQP method (S2QP) for solving nonlinear nonconvex optimization problems. We proved that the method is globally convergent and locally superlinearly convergent under standard assumptions. A critical component of the algorithm is the so-called predictor step, which is computed from a strictly ...
Journal: SIAM Journal on Optimization
Volume: 20
Issue: -
Pages: -
Year: 2010